Comparing MaxEnt and Noisy Harmonic Grammar

Abstract

MaxEnt grammar is a probabilistic version of Harmonic Grammar in which the harmony scores of candidates are mapped onto probabilities. It has become the tool of choice for analyzing phonological phenomena involving variation or gradient acceptability, but there is a competing proposal for making Harmonic Grammar probabilistic, Noisy Harmonic Grammar, in which variation is derived by adding random ‘noise’ to constraint weights. In this paper these frameworks, and variants of them, are analyzed by reformulating them all in a format where noise is added to candidate harmonies, so that the differences between the frameworks lie in the distribution of the noise. This analysis reveals a basic difference between the models: in MaxEnt the relative probabilities of two candidates depend only on their harmony scores, whereas in Noisy Harmonic Grammar they also depend on the violations incurred by the candidates. This difference leads to testable predictions that are evaluated against data on the variable realization of schwa in French (Smith & Pater 2020). The results support MaxEnt over Noisy Harmonic Grammar.
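To make the comparison concrete, here is a minimal Python (NumPy) sketch of how the two models assign probabilities to the candidates of a single tableau. The violation counts, weights, choice of Gaussian noise, and noise standard deviation are illustrative assumptions, not values from the paper; the Noisy HG probabilities are estimated by Monte Carlo sampling rather than computed in closed form.

```python
import numpy as np

# Hypothetical tableau: rows = candidates, columns = constraints;
# each cell is a violation count.  All numbers here are illustrative.
violations = np.array([[1, 0],    # candidate A
                       [0, 2]])   # candidate B
weights = np.array([2.0, 1.5])

# MaxEnt: harmony = -(weighted sum of violations); probabilities are
# proportional to exp(harmony), i.e. a softmax over harmonies.
harmonies = -(violations @ weights)
p_maxent = np.exp(harmonies) / np.exp(harmonies).sum()

# Noisy HG: at evaluation time, random noise is added to the weights and
# the candidate with the best noisy harmony wins.  Estimate the winning
# probabilities by Monte Carlo sampling (Gaussian noise is an assumption).
rng = np.random.default_rng(0)
n_samples, noise_sd = 100_000, 1.0
noisy_weights = weights + rng.normal(0.0, noise_sd, (n_samples, len(weights)))
noisy_harmonies = -(noisy_weights @ violations.T)   # (samples, candidates)
winners = noisy_harmonies.argmax(axis=1)
p_nhg = np.bincount(winners, minlength=len(violations)) / n_samples

print("MaxEnt probabilities:   ", p_maxent)
print("Noisy HG probabilities: ", p_nhg)
```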


Related articles

Varieties of Noisy Harmonic Grammar

Noisy Harmonic Grammar (NHG) is a framework for stochastic grammars that uses the GEN-cum-EVAL system originated in Optimality Theory. As a form of Harmonic Grammar, NHG outputs as winner the candidate with the smallest harmonic penalty (weighted sum of constraint violations). It is stochastic because at each “evaluation time,” constraint weights are nudged upward or downward by a random amount...
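As a concrete illustration of this evaluation procedure, here is a minimal Python sketch of a single NHG evaluation. Gaussian noise on the weights is an assumption of the sketch, and the tableau and weights are made-up values, not figures from the paper.

```python
import numpy as np

def nhg_evaluate(violations, weights, noise_sd=1.0, rng=None):
    """One NHG evaluation: nudge each constraint weight by a random amount,
    then return the index of the candidate with the smallest harmonic
    penalty (weighted sum of constraint violations)."""
    rng = rng or np.random.default_rng()
    noisy_weights = weights + rng.normal(0.0, noise_sd, size=len(weights))
    penalties = violations @ noisy_weights
    return int(penalties.argmin())

# Illustrative 2-candidate, 2-constraint tableau with made-up weights.
tableau = np.array([[1, 0],
                    [0, 2]])
print(nhg_evaluate(tableau, np.array([2.0, 1.5])))
```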

Harmonic Grammar with Harmonic Serialism∗

Loss of restrictiveness through arithmetic: Concern is well-founded here. As we have shown, however, recourse to the full-blown power of numerical optimization is not required. . . In Optimality Theory, constraints are ranked, not weighted: harmonic evaluation involves the abstract algebra of order relations rather than numerical adjudication between quantities. ∗This research would not have be...

Does MaxEnt Overgenerate? Implicational Universals in Maximum Entropy Grammar

It goes without saying that a good linguistic theory should neither undergenerate (i.e., it should not miss any attested patterns) nor overgenerate (i.e., it should not predict any “unattestable” patterns). Recent literature has argued that the Maximum Entropy (ME; Goldwater & Johnson 2003) framework provides a probabilistic extension of categorical Harmonic Grammar (HG; Legendre et al. 1990; S...

Harmonic grammar with linear programming ∗

We show that Harmonic Grammars (HGs) translate into linear systems and are thus solvable using the simplex algorithm, an efficient, widely-deployed optimization algorithm that is guaranteed to deliver the optimal solution if there is one and to detect when no solution exists. Our associated software package HaLP provides a practical tool for studying even large and complex HGs. We provide an in...
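The general idea can be sketched with an off-the-shelf LP solver. The following is not the HaLP package itself, just a minimal illustration using scipy.optimize.linprog, with made-up winner/loser violation vectors and an assumed separation margin.

```python
import numpy as np
from scipy.optimize import linprog

# Made-up winner/loser violation vectors for two hypothetical tableaux.
winners = np.array([[0, 1],
                    [1, 0]])
losers = np.array([[2, 0],
                   [0, 2]])
margin = 1.0   # required harmony advantage of each winner (an assumption)

# HG requires each winner's penalty to beat its loser's by the margin:
#     w . (loser - winner) >= margin,  with w >= 0.
# In linprog's "A_ub @ w <= b_ub" form this is -(loser - winner) @ w <= -margin.
diffs = losers - winners
A_ub = -diffs
b_ub = -margin * np.ones(len(diffs))

# Minimize the summed weights subject to those conditions (any linear
# objective would do for a pure feasibility check).
result = linprog(c=np.ones(diffs.shape[1]), A_ub=A_ub, b_ub=b_ub,
                 bounds=[(0, None)] * diffs.shape[1], method="highs")
print("solvable:", result.success, "weights:", result.x)
```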

Harmonic Grammar, Gradual Learning, and Phonological Gradience

(1) i. HG is (perhaps surprisingly) restrictive, due to inherent limitations on the types of languages that can be generated by an optimization system (Bhatt et al. 2007; Pater et al. 2007) ii. HG is compatible with a simple correctly convergent gradual learning algorithm, the Perceptron algorithm of Rosenblatt (1958) (Boersma and Pater 2007; Pater 2007a; see Jäger 2006, Soderstrom et al. 2006 ...
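As a rough illustration of point ii, here is a minimal perceptron-style weight update for HG on a single winner/competitor pair. The violation counts, learning rate, and clipping of weights at zero are assumptions of this sketch, not details taken from the works cited above.

```python
import numpy as np

# Made-up violation vectors for one attested winner and one competitor.
observed = np.array([0.0, 1.0])     # violations of the attested form
competitor = np.array([2.0, 0.0])   # violations of the rival candidate

weights = np.zeros(2)   # initial constraint weights
rate = 0.1              # learning rate (an assumption)

for _ in range(100):
    # HG selects the candidate with the smaller weighted penalty; a tie is
    # treated as an error so the observed form must win strictly.
    if observed @ weights >= competitor @ weights:
        # Perceptron-style update: shift the weights by the difference of
        # the violation vectors, then keep them non-negative.
        weights = np.clip(weights + rate * (competitor - observed), 0.0, None)

print("learned weights:", weights)
```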

Journal

Journal title: Glossa

Year: 2021

ISSN: 2397-1835

DOI: https://doi.org/10.16995/glossa.5775